66 research outputs found

    Memory effects can make the transmission capability of a communication channel uncomputable

    Most communication channels are subject to noise. One of the goals of information theory is to add redundancy to the transmission so that the information is conveyed reliably and the amount of information transmitted through the channel is as large as possible. The maximum rate at which reliable transmission is possible is called the capacity. If the channel keeps no memory of its past, the capacity is given by a simple optimization problem and can be computed efficiently. The situation for channels with memory is less clear. Here we show that for channels with memory the capacity cannot be computed to within precision 1/5. Our result holds even if we consider one of the simplest families of such channels, information-stable finite-state machine channels, restrict the input and output of the channel to 4 bits and 1 bit respectively, and allow 6 bits of memory.
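    For a memoryless channel, the "simple optimization problem" is the maximization of mutual information over input distributions, which the Blahut–Arimoto algorithm solves iteratively. A minimal pure-Python sketch (function and variable names are illustrative, not from the paper):

```python
import math

def blahut_arimoto(W, iters=50):
    """Capacity in bits of a memoryless channel given as a row-stochastic
    transition matrix W[x][y] = P(output y | input x)."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                              # start from the uniform input
    for _ in range(iters):
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # d[x] = D(W[x] || q), relative entropy of row x to the output law
        d = [sum(W[x][y] * math.log2(W[x][y] / q[y])
                 for y in range(ny) if W[x][y] > 0) for x in range(nx)]
        p = [p[x] * 2.0 ** d[x] for x in range(nx)]  # multiplicative update
        s = sum(p)
        p = [v / s for v in p]
    q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    return sum(p[x] * W[x][y] * math.log2(W[x][y] / q[y])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

# Binary symmetric channel, crossover 0.11: capacity = 1 - h(0.11) ~ 0.50 bits
eps = 0.11
bsc = [[1 - eps, eps], [eps, 1 - eps]]
```

    The paper's point is that no analogous procedure can exist for the finite-state channels it constructs: the capacity there is uncomputable even to within precision 1/5.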

    Efficient reconciliation with rate-adaptive codes in quantum key distribution

    Quantum key distribution (QKD) relies on quantum and classical procedures in order to grow a secret random string, the key, known only to the two parties executing the protocol. Limited intrinsic efficiency of the protocol, imperfect devices and eavesdropping produce errors and information leakage from which the set of measured signals, the raw key, must be stripped in order to distill a final, information-theoretically secure key. The key distillation process is a classical one in which basis reconciliation, error correction and privacy amplification protocols are applied to the raw key. This cleaning process is known as information reconciliation and must be done in a fast and efficient way to avoid hampering the performance of the QKD system. Brassard and Salvail proposed a very simple and elegant protocol to reconcile keys in the secret-key agreement context, known as Cascade, that has become the de facto standard for all practical QKD implementations. However, it is highly interactive, requiring many communications between the legitimate parties, and its efficiency is not optimal, imposing an early limit on the maximum tolerable error rate. In this paper we describe a low-density parity-check reconciliation protocol that improves significantly on both problems. The protocol exhibits better efficiency and limits the number of uses of the communications channel. It is also able to adapt to different error rates while remaining efficient, thus reaching longer distances or a higher secure key rate for a given QKD system.
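    Cascade's interactivity comes from its core primitive: a binary search over sub-block parities that locates one error per mismatching block. A toy sketch of that single step (the full protocol runs several shuffled passes and backtracks, which is what drives the message count up):

```python
def parity(bits, idx):
    """Parity of the sub-block given by the index collection idx."""
    return sum(bits[i] for i in idx) % 2

def binary_correct(alice, bob, idx):
    """Locate and flip one error in a block whose parities differ,
    exchanging one parity per halving (about log2 n messages)."""
    while len(idx) > 1:
        half = idx[:len(idx) // 2]
        # Alice sends parity(alice, half); Bob compares it with his own
        if parity(alice, half) != parity(bob, half):
            idx = half                    # the error is in the first half
        else:
            idx = idx[len(idx) // 2:]     # ...otherwise in the second half
    bob[idx[0]] ^= 1                      # flip the located error

alice = [1, 0, 1, 1, 0, 0, 1, 0]
bob   = [1, 0, 1, 0, 0, 0, 1, 0]          # one transmission error at index 3
if parity(alice, range(8)) != parity(bob, range(8)):
    binary_correct(alice, bob, list(range(8)))
```

    An LDPC-based protocol replaces this back-and-forth with a single syndrome message, which is the efficiency/interactivity trade-off the abstract describes.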

    Improved construction of irregular progressive edge-growth Tanner graphs

    The progressive edge-growth algorithm is a well-known procedure to construct regular and irregular low-density parity-check codes. In this paper, we propose a modification of the original algorithm that improves the performance of these codes in the waterfall region when constructing codes complying with both check and symbol node degree distributions. The proposed algorithm is thus interesting if a family of irregular codes with a complex check node degree distribution is used.
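    A minimal sketch of the underlying PEG idea (not the authors' modified construction; names are illustrative): each new edge of a symbol node goes to a check node not yet reachable from it in the current bipartite graph, which keeps short cycles rare, with ties broken by lowest check degree.

```python
from collections import deque

def peg(n_sym, n_chk, sym_degrees):
    """Simplified progressive edge-growth for a Tanner graph: each new edge
    of symbol s goes to a check not yet reachable from s (ties broken by
    lowest check degree), delaying the creation of short cycles."""
    chk_nbrs = [set() for _ in range(n_chk)]   # check node -> symbol neighbors
    sym_nbrs = [set() for _ in range(n_sym)]   # symbol node -> check neighbors
    for s in range(n_sym):
        for _ in range(sym_degrees[s]):
            # BFS over the bipartite graph: collect checks reachable from s
            seen = set(sym_nbrs[s])
            frontier = deque(seen)
            visited = {s}
            while frontier:
                c = frontier.popleft()
                for s2 in chk_nbrs[c]:
                    if s2 not in visited:
                        visited.add(s2)
                        for c2 in sym_nbrs[s2]:
                            if c2 not in seen:
                                seen.add(c2)
                                frontier.append(c2)
            # Prefer an unreachable check; fall back to any non-neighbor
            cand = [c for c in range(n_chk) if c not in seen] or \
                   [c for c in range(n_chk) if c not in sym_nbrs[s]]
            best = min(cand, key=lambda c: len(chk_nbrs[c]))
            chk_nbrs[best].add(s)
            sym_nbrs[s].add(best)
    return sym_nbrs

H = peg(6, 3, [2] * 6)   # tiny (2,4)-regular-ish example graph
```

    The sketch honors only the symbol degree distribution; the paper's contribution is precisely how to comply with a prescribed check node degree distribution as well.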

    Untainted Puncturing for Irregular Low-Density Parity-Check Codes

    Puncturing is a well-known coding technique widely used for constructing rate-compatible codes. In this paper, we consider the problem of puncturing low-density parity-check codes and propose a new algorithm for intentional puncturing. The algorithm is based on the puncturing of untainted symbols, i.e., symbols with no punctured symbols within their neighboring set. It is shown that the algorithm proposed here performs better than previous proposals for a range of coding rates and small proportions of punctured symbols.
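    The untainted criterion can be sketched greedily as follows (a simplification under assumed data structures; the paper's actual selection order and ranking may differ):

```python
def untainted_puncture(sym_nbrs, chk_nbrs, n_punct):
    """Greedily puncture up to n_punct 'untainted' symbols: a symbol may be
    punctured only if no already-punctured symbol shares a check with it."""
    punctured, tainted = [], set()
    for s in range(len(sym_nbrs)):
        if len(punctured) == n_punct:
            break
        if s in tainted:
            continue
        punctured.append(s)
        for c in sym_nbrs[s]:
            tainted.update(chk_nbrs[c])   # all symbols sharing a check with s
    return punctured

# Toy Tanner graph: 6 symbols, 3 checks, symbol s attached to check s // 2
sym_nbrs = [[0], [0], [1], [1], [2], [2]]
chk_nbrs = [[0, 1], [2, 3], [4, 5]]
```

    Keeping punctured symbols out of each other's neighborhoods means every punctured node sees only unpunctured messages in the first decoding iteration, which is the intuition behind the improved performance.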

    Blind Reconciliation

    Information reconciliation is a crucial procedure in the classical post-processing of quantum key distribution (QKD). Poor reconciliation efficiency, revealing more information than strictly needed, may compromise the maximum attainable distance, while poor performance of the algorithm limits the practical throughput in a QKD device. Historically, reconciliation has been done mainly using heavily interactive procedures with close to minimal information disclosure, like Cascade, or using less efficient but also less interactive procedures, exchanging just one message, like those based on low-density parity-check (LDPC) codes. The price to pay in the LDPC case is that good efficiency is only attained for very long codes and in a very narrow range centered around the quantum bit error rate (QBER) that the code was designed to reconcile, thus forcing the use of several codes if a broad range of QBER values needs to be catered for. Real-world implementations of these methods are thus very demanding, either on computational or communication resources or both, to the extent that the latest generation of GHz-clocked QKD systems is finding a bottleneck in the classical part. In order to produce compact, high-performance and reliable QKD systems it would be highly desirable to remove these problems. Here we analyse the use of short-length LDPC codes in the information reconciliation context using a low-interactivity, blind protocol that avoids an a priori error rate estimation. We demonstrate that LDPC codes of length 2x10^3 bits are suitable for blind reconciliation. Such codes are of high interest in practice, since they can be used for hardware implementations with very high throughput.
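    The blind protocol's control flow can be sketched as follows (a hedged simplification: `decoder` stands in for a real LDPC syndrome decoder, and names are illustrative):

```python
def blind_reconcile(decoder, frame, modulated, step):
    """Blind-protocol sketch: the `modulated` positions start punctured
    (unknown to the decoder); each time decoding fails, `step` of them are
    disclosed as shortened (known) bits and decoding is retried, so no
    a priori QBER estimate is needed."""
    revealed = {}
    while True:
        ok, corrected = decoder(frame, revealed)
        if ok:
            return corrected, len(revealed)
        hidden = [i for i in modulated if i not in revealed]
        if not hidden:
            return None, len(revealed)       # frame cannot be reconciled
        for i in hidden[:step]:
            revealed[i] = frame[i]           # one extra round of disclosure

# Stand-in for an LDPC decoder: here it 'succeeds' once 4 bits are known.
def mock_decoder(frame, revealed):
    return len(revealed) >= 4, frame

corrected, disclosed = blind_reconcile(mock_decoder, [0, 1] * 8, list(range(8)), 2)
```

    The disclosure loop is how a single short code adapts its effective rate to the (unknown) error rate, trading a few extra messages for the dozens that Cascade would need.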

    Superadditivity of Private Information for Any Number of Uses of the Channel

    The quantum capacity of a quantum channel is always smaller than the capacity of the channel for private communication. Both quantities are given by the infinite regularization of the coherent and the private information, respectively, which makes their evaluation very difficult. Here, we construct a family of channels for which the private and coherent information can remain strictly superadditive for an unbounded number of uses, thus demonstrating that the regularization is necessary. We prove this by showing that the coherent information is strictly larger than the private information of a smaller number of uses of the channel. This implies that even though the quantum capacity is upper bounded by the private capacity, the nonregularized quantities can be interleaved.

    SS acknowledges the support of Sidney Sussex College and the European Union under project QALGO (Grant Agreement No. 600700). DE acknowledges financial support from the European CHIST-ERA project CQC (funded partially by MINECO grant PRI-PIMCHI-2011-1071) and from Comunidad de Madrid (grant QUITEMAD+-CM, ref. S2013/ICE-2801). This work has been partially supported by STW, QuTech and by the project HyQuNet (Grant No. TEC2012-35673), funded by Ministerio de Economia y Competitividad (MINECO), Spain. This work was made possible through the support of grant #48322 from the John Templeton Foundation. This is the author accepted manuscript. The final version of the article is available from APS at http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.115.04050
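    The "infinite regularization" mentioned in the abstract is, in standard notation,

```latex
Q(\mathcal{N}) \;=\; \lim_{n\to\infty} \frac{1}{n}\, Q^{(1)}\!\bigl(\mathcal{N}^{\otimes n}\bigr),
\qquad
P(\mathcal{N}) \;=\; \lim_{n\to\infty} \frac{1}{n}\, P^{(1)}\!\bigl(\mathcal{N}^{\otimes n}\bigr),
```

    where Q^{(1)} is the one-shot coherent information optimized over input states and P^{(1)} the corresponding private information. Strict superadditivity, Q^{(1)}(N^{⊗n}) > n Q^{(1)}(N) for some n, is exactly what makes the limit, and hence the paper's construction, necessary.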